A new feedforward neural network hidden layer neuron pruning algorithm
Abstract
This paper deals with a new approach to determining the structure (i.e. the number of hidden units) of a feedforward neural network (FNN). The approach is based on the principle that any FNN can be represented by a Volterra series, i.e. a nonlinear input-output model. The proposed algorithm proceeds in three steps: first, we expand the nonlinear activation function of the hidden-layer neurons in a Taylor series; second, we express the neural network output as a NARX (nonlinear autoregressive with exogenous input) model; finally, we apply the nonlinear order-selection algorithm of Kortmann and Unbehauen to select the most relevant signals of the resulting NARX model. Starting from the output layer, this pruning procedure is performed on each node in each layer. Using this new algorithm together with standard backpropagation (SBP) over various initial conditions, we perform Monte Carlo experiments that lead to a drastic reduction in the number of nonsignificant hidden-layer neurons.
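The three steps above can be illustrated with a minimal sketch. Assuming a single-hidden-layer tanh network y = v·tanh(W x), the snippet below Taylor-expands tanh around zero (the basis of the NARX/Volterra representation) and ranks hidden neurons by their contribution to the output variance. The ranking criterion here is a simplified stand-in for the Kortmann-Unbehauen orthogonal order-selection step, not the authors' exact procedure; all function and variable names are illustrative.

```python
import numpy as np

def tanh_taylor(x, order=5):
    """Taylor expansion of tanh around 0: x - x^3/3 + 2x^5/15 + ...
    (step 1 of the pruning procedure: polynomialize the activation)."""
    coeffs = {1: 1.0, 3: -1.0 / 3.0, 5: 2.0 / 15.0}
    return sum(c * x ** k for k, c in coeffs.items() if k <= order)

def hidden_relevance(X, W, v):
    """Rank hidden neurons of y = v . tanh(W x) by the output variance
    each one contributes -- a simplified surrogate for selecting the
    most relevant regressors of the equivalent NARX model."""
    A = np.tanh(X @ W.T)              # hidden activations, shape (N, H)
    contrib = np.var(A * v, axis=0)   # per-neuron contribution to the output
    order = np.argsort(contrib)[::-1] # most relevant neurons first
    return order, contrib

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))          # input samples
    W = rng.normal(size=(4, 3))            # 4 hidden neurons, 3 inputs
    v = np.array([1.0, 0.8, 1e-6, 0.0])    # neurons 2 and 3 are nonsignificant
    order, contrib = hidden_relevance(X, W, v)
    print("relevance ranking:", order)      # neurons 0 and 1 rank first
```

Neurons whose contribution falls below a chosen threshold would then be pruned, and the procedure repeated layer by layer as described in the abstract.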
Similar papers
A Constructive Algorithm for Feedforward Neural Networks With Incremental Training
We develop, in this brief, a new constructive learning algorithm for feedforward neural networks. We employ an incremental training procedure where training patterns are learned one by one. Our algorithm starts with a single training pattern and a single hidden-layer neuron. During the course of neural network training, when the algorithm gets stuck in a local minimum, we will attempt to escape...
A Single Hidden Layer Feedforward Network with Only One Neuron in the Hidden Layer Can Approximate Any Univariate Function
The possibility of approximating a continuous function on a compact subset of the real line by a feedforward single hidden layer neural network with a sigmoidal activation function has been studied in many papers. Such networks can approximate an arbitrary continuous function provided that an unlimited number of neurons in a hidden layer is permitted. In this note, we consider constructive appr...
Extracting Rules from Neural Networks by Pruning and Hidden-Unit Splitting
An algorithm for extracting rules from a standard three-layer feedforward neural network is proposed. The trained network is first pruned not only to remove redundant connections in the network but, more important, to detect the relevant inputs. The algorithm generates rules from the pruned network by considering only a small number of activation values at the hidden units. If the number of inp...
A novel genetic-algorithm-based neural network for short-term load forecasting
This paper presents a neural network with a novel neuron model. In this model, the neuron has two activation functions and exhibits a node-to-node relationship in the hidden layer. This neural network provides better performance than a traditional feedforward neural network, and fewer hidden nodes are needed. The parameters of the proposed neural network are tuned by a genetic algorithm with ar...
Forecasting Sunspot Numbers with Neural Networks
This paper presents a feedforward neural network approach to sunspot forecasting. The sunspot series were analyzed with feedforward neural networks, formalized based on statistical models. The statistical models were used as comparison models along with recurrent neural networks. The feedforward networks had 24 inputs (depending on the number of predictor variables), one hidden layer with 20 ...